The Week That Was
Oct. 9, 2004
1. New on the Web: OUR NUCLEAR PHYSICS GURU, DR. GORDON PRATHER, DESCRIBES A PROMISING OPTION FOR NUCLEAR POWER: RUSSIAN PLANS FOR REACTORS MOUNTED ON FLOATING BARGES THAT CAN ALSO SUPPLY DESALINATED WATER TO AREAS SHORT OF FRESH WATER.

2. FRESH WATER: INCREASE IN RAINFALL FROM GLOBAL WARMING

3. ENVIRONMENT IMPROVES UNDER BUSH

4. COAL GENERATORS CLEAN UP THEIR ACT

5. GLOBAL WARMING FANATICS FEAR SOUND SCIENCE.

6. SCIENCE SUPPORT FOR GLOBAL WARMING DWINDLES: "HOCKEYSTICK" BROKEN AGAIN

7. CONSENSUS CAN BE BAD FOR CLIMATE SCIENCE

8. BRITAIN PLANS CONTENTIOUS APPROACH TO CLIMATE-CHANGE POLICIES.

9. AN ADMIRAL AT THE HELM OF NOAA
**********************

2. Fresh Water: Increase in Rainfall from Global Warming
WJR Alexander, Professor Emeritus, Department of Civil and Biosystems Engineering, University of Pretoria, South Africa. Email alexwjr@iafrica.com

What is the most important consequence of global warming? It is not the melting of the Arctic icecap or glaciers. Nor is it the damage to the natural environment -- coral reefs, wetlands, flora and fauna -- important as they may be. Nor, as I will show, is it the threat of floods, droughts and tropical diseases. Nor is it an increase in temperature, which will be adverse in some regions and beneficial in others.

The most important consequence is the beneficial increase in rainfall for the hundreds of millions of people across the globe who rely on subsistence agriculture and ephemeral river flow for their survival. Why, then, was an increase in rainfall relegated to a minor consequence in the IPCC's Summary for Policymakers published in 2001? Honesty in science requires that scientists not deliberately suppress information that conflicts with their views. Presenting both the pros and cons of global warming should be the overriding criterion for the IPCC in particular, as it is an international and impartial body.

As I will show, there is a very large sector of the global community that has no faith in the dire predictions of climatologists. The main reasons for their indifference are the level of rhetoric, the unbalanced views, the political connotations (local, national and international), the lack of substance behind many of the claimed adverse effects of climate change, and the omission of any reference to the beneficial changes that matter most to the welfare of humankind, namely rainfall and river flow.
************************


3. Environment Improves Under Bush

Environmental activists have accused President Bush of being America's "worst environmental president," yet the empirical evidence suggests the environment is getting cleaner, says Steven F. Hayward of the American Enterprise Institute.

AEI and the Pacific Research Institute compile an annual Index of Leading Environmental Indicators. The 2004 report shows that the environment has consistently improved since the 1980s.

O Between 1976 and 2002, ambient air pollution decreased significantly; ozone levels declined by 31 percent, sulfur dioxides by 71 percent, carbon monoxide by 75 percent, and nitrogen dioxide by 41 percent.

O When measuring emissions on a per capita basis or per dollar of gross domestic product (GDP), the United States experienced emission declines on par with Europe between 1982 and 1998.

O The percentage of the population served by water supplies with no reported violations of health-based standards increased from 79 percent in 1993 to 94 percent in 2002.

O Since 1988, so-called toxic releases have declined 60 percent cumulatively, with a 90-percent decline in dioxin since 1970.

Furthermore, the number of new species listed under the Endangered Species Act has declined substantially between 1996 and 2003, likely due to private efforts in species conservation.

The only environmental trend that has not been positive is the management of public lands. Between 90 and 200 million acres of public land are at risk for catastrophic fires. Public parks are experiencing billions of dollars in maintenance backlogs. Yellowstone Park, for example, is contaminated with sewage. Holly Lipke Fretwell of the Property and Environment Research Center recommends allowing states to manage land and the public to lease land and resources.

Source: Steven F. Hayward, et al., "2004 Index of Leading Environmental Indicators, Ninth Edition," American Enterprise Institute, July 2004.
========================

EPA Cites Decline in Principal Air Pollutants:


EPA Administrator Michael Leavitt recently released a report showing a decrease in aggregate emissions of six principal air pollutants in 2003. According to BNA's Daily Environment Report, EPA's Air Emissions Trends Report documents that emissions of carbon monoxide, nitrogen oxides, sulfur dioxide, particulate matter, volatile organic compounds, and lead have been cut from 301.5 million tons per year in 1970 to 147.8 million tons in 2003.
**************************


4. Coal Generators Clean Up Their Act
By Ken Silverstein Director, Energy Industry Analysis

The nation must pursue a pragmatic energy policy. A real question exists as to whether natural gas producers will gain more access to restricted lands and whether the wells they drill will satisfy America's long-term, growing energy needs. Given that dilemma and given that the United States has vast coal reserves, many energy analysts are endorsing the advancement of clean coal technologies.

"In order to become less dependent on foreign sources of energy, we've got to find and produce more energy at home, including coal," says President Bush. "I believe that we can have coal production and enhanced technologies in order to make sure the coal burns cleaner. I believe we can have both." The Democratic nominee, Sen. John Kerry of Massachusetts, agrees that "clean coal" is essential to the country's energy picture. He would commit $10 billion over 10 years to the effort.

Some clean coal technologies offer the potential for giving even high-sulfur "dirty" coals many of the same environmental qualities of natural gas. Others also greatly reduce greenhouse gas emissions by boosting power plant efficiencies and releasing carbon gases in a form that can be more easily captured and prevented from entering the atmosphere.

The Clean Air Act of 1990 requires that sulfur dioxide emissions be capped at 8.9 million tons a year and that nitrogen oxide emissions be limited to 2 million tons annually starting in 2008, which is about an 85 percent reduction over what is currently allowable. Altogether, between 1970 and 1998, sulfur dioxide emissions dropped by 76 percent, nitrogen oxide emissions fell by 58 percent and particulate matter declined by 96 percent, says the Environmental Protection Agency.

The new technologies now available hold out even more promise. Take Reliant Energy's Seward plant, which is located 80 miles east of Pittsburgh. It is the largest waste-coal-fired generating plant in the world and is the only merchant plant of its kind in the U.S. The plant is currently undergoing pre-operational testing, with commercial operation expected this fall. Seward is projected to be one of the lowest-cost generating plants in the PJM Interconnection. The new 521-megawatt facility produces two and one-half times as much electricity as the plant it replaces, while significantly reducing emissions.

In addition, Seward uses low-grade refuse from coal mines, which is abundant in western Pennsylvania. Consuming this waste coal improves the environment by removing a significant source of acid discharge from the local watershed. In all, more than 100 million tons of waste coal will be removed from the landscape during the life of the plant.

Altogether, the Bush administration has committed $2 billion over 10 years to its Clean Coal Power Initiative. In the last round of bids, 13 companies submitted proposals to the administration valued at $6 billion. Winners will be selected by January. In the first round, the administration chose eight projects to help fund. Among the current competitors: Basin Electric Power Cooperative, Southern Co. Services and Minnesota Power.

Coal may be controversial, but it's here to stay. It costs about a third as much as natural gas, and U.S. reserves amount to some 200 years' worth of supply. The Energy Information Administration now projects that coal will actually increase its share of the U.S. electricity-generation market from 50 percent in 2002 to 52 percent by 2025 because of the expected increase in demand for electricity. This year, as the economy gathers steam, the demand for power is anticipated to rise 2.4 percent. Meantime, UtiliPoint International research finds that more than 90 coal plants with a total capacity of 50 gigawatts are now under consideration.

American Electric Power, for example, said it expects to construct the largest clean-coal facility in the country. It announced this month that it wants to build a 1,000-megawatt plant in the next six years that runs on "integrated gasification combined cycle" technology. Basically, that IGCC facility would convert coal to synthetic natural gas - an investment that would cost $1.6 billion, or 20 percent more per kilowatt than other modern coal-fired plants.

Xcel Energy says that it expects a "significant portion" of its new generation to be coal-fired, both to combat the high price of natural gas and to grow the asset base that will help improve its credit ratings. The Wisconsin Public Service Commission has given final approval to Wisconsin Energy Corp. to build two 615-megawatt coal plants at a cost of $2.15 billion. The regulated plants are guaranteed a 12.7 percent return on equity. A third plant - one that would run cleaner than conventional plants and also be included in the rate base - was rejected. The commission noted that those technologies are expensive and unproven.

Wisconsin Energy not only says that it will pursue the third plant; it also says that without the coal alternative, it would be unable to economically meet future demand, which is expected to grow 2-3 percent a year. Opponents, however, are trying to stop construction of the coal plants - even though Wisconsin Energy says that it will retire two older coal facilities that are embroiled in federal lawsuits. The state's department of natural resources must still approve the plan.

Today's coal-fired plants have a fuel efficiency of 33-35 percent. With the new technologies, such as gasification, that efficiency is said to rise to 40-50 percent, and potentially as much as 60 percent. Their cost is about $1,200 to $1,600 per kilowatt, compared with about $900 for conventional coal plants.
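To see how the higher efficiency works against the capital premium, here is a back-of-the-envelope sketch; the delivered coal price and the two representative efficiencies are our illustrative assumptions, not figures from the article:

```python
# Rough fuel-cost comparison: conventional vs. gasification coal plant.
# COAL_PRICE and the two efficiency values are assumed for illustration.
COAL_PRICE = 1.50        # $/MMBtu, assumed delivered coal price
MMBTU_PER_MWH = 3.412    # thermal energy content of 1 MWh of electricity

def fuel_cost_per_mwh(efficiency: float) -> float:
    """Fuel cost ($) to generate 1 MWh at the given thermal efficiency."""
    return COAL_PRICE * MMBTU_PER_MWH / efficiency

conventional = fuel_cost_per_mwh(0.34)  # midpoint of the 33-35% range
gasification = fuel_cost_per_mwh(0.45)  # midpoint of the 40-50% range

print(f"conventional plant: ${conventional:.2f}/MWh fuel cost")
print(f"gasification plant: ${gasification:.2f}/MWh fuel cost")
# Roughly $15/MWh vs. $11/MWh -- about a 25% cut in fuel burn per MWh,
# which partly offsets the $300-$700/kW capital premium cited above.
```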

Many utilities are looking closely at other types of modern equipment that can make coal-fired generation cleaner. Right now, about 140 scrubbers that remove harmful pollutants from flue gas are installed at the nation's 540 coal plants. AEP says that it will spend $1.2 billion over three years to install scrubbers that will reduce its sulfur dioxide emissions. Meanwhile, Duke is retrofitting an existing coal-fired facility, a project that will reduce such emissions by 90 percent. Altogether, the company will spend $1.5 billion to update its technologies.

Environmentalists argue that the most effective solution is to phase out existing coal plants that cause most of the pollutants and to emphasize green power such as wind, solar and hydrogen. To that end, the current president has emphasized fossil-fuel production while his opponent argues that a far greater emphasis ought to be placed on finding newer, cleaner and more sustainable energy forms. Both candidates, however, want to see new clean-coal technologies reach the commercial stage.
***************************


5. Global Warming Fanatics Fear Sound Science.

The Wall Street Journal pointedly suggests that global-warming alarmists must themselves realize that the evidence increasingly fails to demonstrate the expected greenhouse warming. How is this manifest? They want to be exempted from the Data Quality Act:


"We've long been skeptics about the science behind the political campaign to regulate greenhouse gasses, so imagine our surprise to discover that some of the global warmists seem to agree. How else to read a paragraph that was included in a recent Senate spending bill exempting climate programs from having to pass scientific scrutiny? The legislative language excuses any "research and data collection, or information analysis conducted by or for the National Oceanic and Atmospheric Administration" (the agency charged with monitoring climate change) from the Data Quality Act, a new law that requires sound science in policymaking. This is the sole exemption in the bill. ...our sources say it was included at the request of Democrats on the Senate subcommittee that wrote the spending bill in question, but that now the exemption is getting the attention of Chairman Judd Gregg, who says he intends to remove it. Let's hope so. Surely those who claim to believe most in climate change aren't afraid to subject their theories to even basic tests of scientific accuracy. Or are they?
http://online.wsj.com/article/0,,SB109641505795030748,00.html?mod=opinion (subscription required)

******************************


6. "Hockeystick" Broken Again

A major IPCC claim is that the 20th century was the warmest in the past 1000 years - and therefore "proves" a human influence. The graph purports to show a steadily decreasing temperature from about AD 1000 until 1850, when the curve suddenly turns up sharply -- presumably because of greenhouse warming. Hence the name "hockeystick." Never mind that all this is wishful thinking and proves nothing. It isn't even true!

Our readers know that this claim is based mainly on a single and likely flawed study, published in Nature in 1998 by Mann, Bradley and Hughes. While it has been stalwartly defended by Global Warming apologists, we have realized for several years that the work is suspect. First, every other analyst of proxy data has concluded that there was a Medieval Warm Period, likely warmer than the 20th century. This evidence was most recently reviewed by Soon and Baliunas, who have been widely attacked for their temerity. (How dare they doubt the IPCC?) In November 2003, McIntyre and McKitrick pointed out that the actual data used by Mann et al. had been mann-handled, suggesting that they had been doctored. Mann has been forced to publish a Corrigendum in Nature, but the dispute is not yet settled.

Now one of the IPCC's icons, German climate modeler Hans von Storch, has published a bombshell paper that breaks the hockeystick into little pieces. We publish here different perspectives on this paper.


1) First, von Storch himself: in an interview in the German weekly Der Spiegel (No. 41, Oct 4, 2004, p. 158), he comments on the dispute between scientists concerning the temperature curve of the last thousand years and the greenhouse effect.

"The Mann graph indicates that it was never warmer during the last thousand years than it is today. In a near perfect slope the curve declines from the Middle Ages up to 1800, only to shoot up sharply with the beginning of fossil-fuel burning. Mann's calculations rest, inter alia, on analyses of tree rings and corals. We were able to show in a publication in 'Science' that this graph contains assumptions that are not permissible. Methodologically it is wrong: rubbish."

[SFS: A better translation of "Quatsch" might be "Junk"]

"His influence in the community of climate researchers is great. And Mann rejects any reproach most forcefully. His defensiveness is understandable. Nobody likes to see his own child die. But we must respect our credibility as research scientists. Otherwise we play into the hands of those sceptics of global climate change who imagine a conspiracy between science and politics."
=======================

And now, excerpts from the NYT

New research questions uniqueness of recent warming
By ANDREW C. REVKIN, New York Times, 5 October 2004

A new analysis has challenged the accuracy of a climate timeline showing that recent global warming is unmatched for a thousand years. That timeline, generated by stitching together hints of past temperatures embedded in tree rings, corals, ice layers and other sources, is one strut supporting the widely accepted view that the current warm spell is being caused mainly by accumulating heat-trapping smokestack and tailpipe emissions.

The authors of the study, published in the current issue of the online journal ScienceExpress, did not dispute that a sharp warming was underway and that its pace could signal a human influence. But they said their test of the methods used to mesh recent temperature records with centuries-old evidence showed that past natural climate shifts were most likely sharply underestimated.

Many climate scientists credited the new study with pointing out how much uncertainty still surrounds efforts to turn nature's spotty, unwritten temperature records into a climate chronology.

An accompanying commentary in ScienceExpress, by Dr. Timothy J. Osborn and Dr. Keith R. Briffa, scientists at the University of East Anglia in Britain, said it implied that "the extent to which recent warming can be viewed as 'unusual' would need to be reassessed."

The significance of the new analysis comes partly because the record it challenges is a central icon in the debate over whether heat-trapping emissions should be curbed. The hallmark of the original method is a graph widely called the "hockey stick" because of its shape: a long, relatively unwavering line crossing the last millennium and then a sharp, upward-turning "blade" of warming over the last century.

But many climate sleuths acknowledged that while the broad climate trends were clear, much remained uncertain.

"I don't think anyone in the field would doubt we may be underestimating" some past climate shifts, said Dr. Raymond S. Bradley, a University of Massachusetts climate expert and co-author of Dr. Mann's. "For the general point von Storch is making," he added, "fair enough."
----------------------------
[SFS: Revkin fails to tell the NYTimes readers that the hockeystick was the last remaining strut holding up the IPCC hoax of manmade Global Warming]
====================

And now, the analysis of the Idso group:
The Broken Hockey Stick: Problems Are Piling Up With Flawed Methodology
CO2 Science Magazine, 6 October 2004
http://www.co2science.org/journal/v7/v7n40c1.htm

Reference: von Storch, H., Zorita, E., Jones, J., Dimitriev, Y., Gonzalez-Rouco, F. and Tett, S. 2004. Reconstructing past climate from noisy data. www.scienceexpress.org, 30 September.

What was done
The authors used a coupled atmosphere-ocean model simulation of the climate of the past millennium as a surrogate climate to test the skill of the empirical reconstruction methods used by Mann et al. (1998, 1999) in deriving their thousand-year "hockeystick" temperature history of the Northern Hemisphere (NH). This they did by (1) generating a number of pseudo-proxy temperature records by sampling a subset of the model's simulated grid-box temperatures representative of the spatial distribution of the real-world proxy temperature records used by Mann et al. in creating their hockeystick history, (2) degrading these pseudo-proxy records with statistical noise, (3) regressing the results against the measured temperatures of the historical record, and (4) using the relationships thus derived to construct a record they could compare against their original model-derived surrogate temperature history.
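A minimal sketch of this pseudo-proxy test follows; the red-noise surrogate climate, the noise level, the 15 "sites" and the 100-year calibration window are our toy assumptions, not values from von Storch et al.:

```python
# Pseudo-proxy experiment, steps (1)-(4) above, with toy numbers standing
# in for the coupled-model run.
import numpy as np

rng = np.random.default_rng(0)
n_years, n_proxies = 1000, 15

# Surrogate "true" NH temperature: a slow random-walk component
# (century-scale variability) plus year-to-year weather noise.
slow = np.cumsum(rng.normal(0.0, 0.02, n_years))
true_T = slow + rng.normal(0.0, 0.15, n_years)

# Steps (1)-(2): pseudo-proxies = true signal at each "site", degraded
# with substantial non-climatic noise.
proxies = true_T[:, None] + rng.normal(0.0, 0.5, (n_years, n_proxies))

# Step (3): calibrate each proxy against the "instrumental" record
# (here the final 100 years) by ordinary least-squares regression.
cal = slice(n_years - 100, n_years)
recon = np.zeros(n_years)
for j in range(n_proxies):
    slope, intercept = np.polyfit(proxies[cal, j], true_T[cal], 1)
    recon += (slope * proxies[:, j] + intercept) / n_proxies

# Step (4): compare centennial variability of reconstruction vs. truth.
def centennial_std(x):
    """Std. dev. of non-overlapping 100-year means."""
    return x.reshape(-1, 100).mean(axis=1).std()

print(f"centennial variability, truth:          {centennial_std(true_T):.3f}")
print(f"centennial variability, reconstruction: {centennial_std(recon):.3f}")
# The reconstructed value comes out markedly smaller: regression against
# a short, noisy calibration window systematically damps long-term swings.
```

The damping persists however the toy numbers are varied, so long as the proxies are noisy and the calibration window is short relative to the full record.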

What was learned
Von Storch et al. report that the centennial variability of the NH temperature was underestimated by the regression-based methods they applied, suggesting, in their words, that past variations in real-world temperature "may have been at least a factor of two larger than indicated by empirical reconstructions." The unfortunate consequences of this result are readily evident in the reduced degree of Little Ice Age cooling and Medieval warming that result from the fault-prone techniques employed by Mann et al.

What it means
In an accompanying commentary, Osborn and Briffa (2004) state that "if the true natural variability of NH temperature is indeed greater than is currently accepted," which they appear to suggest is likely the case, "the extent to which recent warming can be viewed as 'unusual' would need to be reassessed." That this reassessment is sorely needed is also suggested by the fact that what von Storch et al. refer to as "empirical methods that explicitly aim to preserve low-frequency variability (Esper et al., 2002)" show much more extreme Medieval warming and Little Ice Age cooling than do the reconstructions of Mann et al., which suffer from the problems elucidated in this important new study of von Storch et al.

In light of these observations, it is becoming ever more evident that the temperature record of Esper et al. is likely to be much more representative of reality than is the IPCC-endorsed record of Mann et al., and that the lion's share of the warming experienced since the end of the Little Ice Age occurred well before mankind's CO2 emissions significantly perturbed the atmosphere. This indicates that the majority of post-Little Ice Age warming was due to something other than rising atmospheric CO2 concentrations -- which in turn suggests that the lesser warming of the latter part of the 20th century may well have been due to something else as well.

References
Esper, J., Cook, E.R. and Schweingruber, F.H. 2002. Low-frequency signals in long tree-ring chronologies for reconstructing past temperature variability. Science 295: 2250-2253.

Mann, M.E., Bradley, R.S. and Hughes, M.K. 1998. Global-scale temperature patterns and climate forcing over the past six centuries. Nature 392: 779-787.

Mann, M.E., Bradley, R.S. and Hughes, M.K. 1999. Northern Hemisphere temperatures during the past millennium: Inferences, uncertainties, and limitations. Geophysical Research Letters 26: 759-762.

Osborn, T.J. and Briffa, K.R. 2004. The real color of climate change? www.scienceexpress.org, 30 September 2004.

Copyright © 2004. Center for the Study of Carbon Dioxide and Global Change

============
And now, some general background:
Ironies Abound In Hockey Stick Debacle
World Climate Alert, 6 October 2004
http://www.co2andclimate.org/wca/2004/wca_24c.html

Why are so many researchers concerned with reconstructing a thousand years of Earth's climate history? Some will argue it's actually a political debate; to the winner goes the spoils - passage of or withdrawal from the Kyoto Protocol by governments worldwide.

The 20th century indeed was warm. We know this to be so because temperatures could be measured using instruments designed for that purpose. What they indicate is that global temperature increased by about three-quarters of a degree Celsius. The question becomes: was that rate of warming unusual in a longer-term context? We'll probably never be certain, because there were no comparable instruments taking measurements in earlier centuries. Barring an unlikely discovery (that the Church, say, operated a secret, well-calibrated global thermometric measurement program during the Crusades), any comparison of contemporary measurements with those of the past will be, by definition, fraught with error.

Climatic reconstructions are tempting because they offer insight into our climatological past. Such 'reconstructions' are assembled using all manner of less-than-ideal climate indicators, things like boreholes, tree rings, ice cores, and historical accounts (e.g. "General Washington requisitioned five new silk blouses after the summer of 1782 in comparison to only two the previous year, from which we reconstruct the following temperature and humidity time series..."). The key to the credibility of such reconstructions is to accurately appraise the size of the error associated with each temperature estimate.

A climate reconstruction assembled by University of Virginia Assistant Professor Michael Mann is a favorite of the Kyoto-friendly Intergovernmental Panel on Climate Change (IPCC). While many other Northern Hemisphere (NH) temperature reconstructions include a Medieval Warm Period in the 11th and 12th centuries and a Little Ice Age lasting into the 19th, along with a few other short cool periods (all contributing considerable variability to the last millennium's climate), Mann's so-called 'hockey stick' indicates 900 years of approximately flat global temperature followed by a century of dramatically rising temperatures. This gives his graph its hockey-stick character, with the steeply rising blade coinciding with a time when humans were adding lots of greenhouse gases to the atmosphere. The question arises: Were Northern Hemisphere temperatures between 1000 and 1900 really so unvarying? German climatologist Hans von Storch and several colleagues cast serious doubt upon the accuracy of Mann's reconstruction in a new paper in Science.

By way of background, climate reconstructions incorporate concurrent measurements from modern periods during which both instrumental and proxy indicators exist. Their relationship during the modern era (typically derived using statistical regression techniques) is then applied to proxy measurements from the past (when there were no instruments) to 'hindcast' climate.

What makes von Storch's research so interesting is that he and his colleagues cleverly work around the data unavailability problem using a climate model. Because their climate model is able to reasonably simulate the 20th century's temperature record (as to mean and variability), they selected local proxies from a 1,000-year run of the model at locations where paleo-climatologists like Mann claim to have proxy samples. They then add an appropriate amount of variability (or statistical 'noise') to each of the local samples to simulate the non-temperature component of actual proxy data.

When von Storch recreates his modeled thousand-year Northern Hemisphere temperature history from the relationship between his pseudo-proxies and the modeled global temperatures during the 20th century (using a technique similar to that used by Mann), he finds that the reconstruction greatly underestimates the amount of long-term variability. The more noise that is added, the less variability is captured. It is important that our readers understand that with the statistical approach employed by von Storch et al., it is not at all necessary for their climate model to be a perfectly accurate simulation of 20th century temperature. It need only be an acceptable representation of general temperature variability over the time period.
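The statistical mechanism behind "the more noise, the less variability" is classical regression attenuation. As a minimal sketch in our own notation (not von Storch's): write a proxy as the true temperature plus independent non-climatic noise, and calibrate by least squares:

```latex
P = T + \varepsilon, \qquad
\hat{\beta} = \frac{\operatorname{Cov}(P,T)}{\operatorname{Var}(P)}
            = \frac{\sigma_T^{2}}{\sigma_T^{2}+\sigma_\varepsilon^{2}} < 1,
\qquad
\operatorname{Var}(\hat{\beta}P) = \hat{\beta}\,\sigma_T^{2}.
```

Since the calibrated slope is below one, the reconstruction retains only the fraction beta-hat of the true temperature variance; with noise twice the signal, for instance, only 20% of the variance survives - the same order as the "20% of the 100-year variability" figure quoted in the New Scientist excerpt below.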

In a short "Perspective" in Science that evaluates von Storch's article, East Anglia's Timothy Osborn and Keith Briffa comment, "If the true natural variability of NH temperature is indeed greater than is currently accepted, the extent to which recent warming can be viewed as 'unusual' would need to be reassessed." In the larger context of climate politics, their phraseology ("currently accepted") becomes piquantly ironic. The clear implication is that Mann's temperature reconstruction (the one favored by the IPCC) is what the majority of the world's climatologists believe to be most accurate. If that is the case, then most of the people studying climate must believe the Medieval Warm Period and the Little Ice Age to be minor excursions on a Northern Hemisphere temperature curve dominated by 20th century warming. The problem is, such thinking flies in the face of literally hundreds of research papers that document the existence and widespread impacts of the Medieval Warm Period and Little Ice Age.

Wouldn't it be nice if scientists could actually get together and compile evidence of those climatic events from the scientific literature? Guess what? Harvard scientists Willie Soon and Sallie Baliunas have done just that and published their results in the journal Climate Research last year.

There were howls of protest from IPCC scientists, including Mann, who claimed it was necessary to change the editorial policies of Climate Research to prevent any future 'mishaps' of this kind. Von Storch was appointed Editor-in-Chief. His first, and ultimately only, act was to prepare an apology for Climate Research's having allowed the Soon and Baliunas review to see the light of day. This was quickly nixed by the journal's publisher; von Storch resigned in protest.

Here's the irony: the results von Storch just published in Science effectively agree with what Soon and Baliunas wrote in Climate Research in 2003: the Mann reconstruction severely underestimates past variability. So why did von Storch protest so vociferously when he was Editor-in-Chief? And why doesn't he reference Soon and Baliunas in his Science article, especially considering that their technique (i.e. comparative literature review) is an example of a method that is essentially free from the type of errors von Storch identified in the work of Mann and others?

This latest research effort further calls into question the accuracy of Mann's temperature reconstruction (the hockey stick) and comes on the heels of another article, by Steve McIntyre and Ross McKitrick, in which they document the difficulties they encountered in recreating the Mann et al. temperature curve using Mann's own data. Their paper isn't cited by von Storch either. Their work was in turn preceded by research by Jan Esper et al. showing much greater long-term temperature variability (see http://www.co2andclimate.org/wca/2004/wca_15e.html for more information on Esper's findings). As evidence builds to support the long-standing paradigm that climate varies, it's time for the IPCC to follow the lead of National Hockey League players who, with their threat to strike, are willing to put their hockey sticks to rest in defense of a principle - in this instance not higher pay, but sound science.

References:
Esper, J., D.C. Frank, and J.S. Wilson, 2004. Climate reconstructions: Low-frequency ambition and high-frequency ratification. Eos, 85, 133, 120.

Esper, J., E.R. Cook, and F.H. Schweingruber, 2002. Low frequency signals in long tree-ring chronologies for reconstructing past temperature variability, Science, 295, 2250-2253.

Intergovernmental Panel on Climate Change, 2001. Climate Change 2001: The Scientific Basis. Houghton, J.T., et al., (eds.), Cambridge University Press, Cambridge, U.K, pp 881, http://www.grida.no/climate/ipcc_tar/wg1/index.htm.

Mann, M.E., R.S. Bradley, and M.K. Hughes, 1998. Global-scale temperature patterns and climate forcing over the past six centuries. Nature, 392, 779-787.

Mann, M.E., R.S. Bradley, and M.K. Hughes, 1999. Northern Hemisphere temperatures during the past millennium: inferences, uncertainties, and limitations. Geophysical Research Letters, 26, 759-762.

Mann, M.E., and P.D. Jones, 2003. Global surface temperature over the past two millennia. Geophysical Research Letters, doi:10.1029/2003GL017814.

McIntyre, S., and R. McKitrick, 2003. Corrections to the Mann et al. (1998) proxy database and Northern Hemispheric average temperature series. Energy & Environment, 14, 751-771.

Osborn, T.J., Briffa, K.R., 2004. The real color of climate change. Sciencexpress, September 30, 2004.

Soon, W., and S. Baliunas, 2003. Proxy climatic and environmental changes of the past 1,000 years. Climate Research, 23, 89-110.

Von Storch, H., et al., 2004. Reconstructing past climate from noisy data. Sciencexpress, September 30, 2004.
======================

And now, a comment in Nature (published online: 30 September 2004; doi:10.1038/news040927-16)
Past climate change questioned
Swings in temperature might be more common than thought.

By Quirin Schiermeier

The Earth's temperature may have fluctuated more wildly during the past 1000 years than previously thought, according to a new study that challenges how researchers use tree rings and corals to give us a picture of the Earth's past.

If true, the study suggests that recent warming might not be as unique as was thought previously, and might partly be due to natural temperature cycles, rather than humans spewing carbon dioxide into the atmosphere.

To work out the planet's temperature during the past few hundred years, researchers often look at the width and density of annual rings in trees or the growth of corals. Such temperature indicators, known as proxies, are then used to construct average global temperatures.

But this method could be tainted by a systematic error, according to Hans von Storch, a climate modeller at the GKSS Institute for Coastal Research in Geesthacht, Germany, and his colleagues. Consequently, researchers might have underestimated the size of temperature fluctuations from medieval times until the nineteenth century by a factor of two or more.
******************************

And now, from a GW supporter
Models may underestimate climate swings
NewScientist.com news service 30 September 2004

The climate may have varied much more wildly in the past than reconstructions from tree-rings and ice-cores suggest, say climate scientists who have studied 1000 years of simulated data.

The findings by Hans von Storch from the GKSS Research Institute in Geesthacht, Germany and colleagues are provoking a heated dispute. While some scientists warn that their results imply climate changes in the future could be more dramatic than predicted, others argue that their methods are flawed.

To reach their controversial conclusion, von Storch and colleagues used a sophisticated computer model to simulate the Earth's climate over one millennium. They extracted records of temperature at particular locations and then added "noise" to the signal to mimic the kind of data that scientists can collect in the field.

For example, the thickness of tree rings is related to temperature, but it is also affected by factors such as moisture levels, or insect plagues, so there are large errors in this record. Next von Storch's team tried to reconstruct, from this noisy data, the temperature in the northern hemisphere for each year of the 1000-year simulation. They used a statistical method that other scientists, including Michael Mann at the University of Virginia in Charlottesville, Virginia, have employed.

Then they compared their estimated temperatures to those taken directly from the model. Although annual and decadal variation showed up in the reconstruction, in some cases "only 20% of the 100-year variability is recovered". In effect, reconstructing temperatures from the noisy data did not reveal long-term climate trends.

If past climate variability has been underestimated, then predictions for future fluctuations in the temperature might be too conservative, say experts.

"One of the conclusions we draw is that the climate's sensitivity might be higher, and therefore future climate change will be greater," says Timothy Osborn, an expert in climate variability at the University of East Anglia in Norwich, UK.

But Mann says the study is flawed, because the simulation the team uses churns out larger changes in climate than most scientists think are reasonable, putting the method to a more stringent test than is fair. "I was not asked to review the von Storch paper, which I consider unfortunate," he told New Scientist.
******************

And finally, from GW skeptic Dr Jarl Ahlbeck (Finland):

I don't think Michael Mann came up with his Hockey Stick just because he first discovered Medieval Warming and Little Ice Age in the data and then wanted to prove that there was no MW or LIA at all. He only put a lot of numbers into his computer, played with statistical subroutines for a while and wow! headline-breaking results popped up! Of course, he was excited and happy to become the man who was able to create a radical shift of paradigm! Good old [IPCC chairman] Bert Bolin was almost dancing with joy when he presented the Hockey Stick for the first time.

The problem was that nobody could repeat the calculations, because Mann did not provide exact information about the data files and computing procedures. So much for the referee process. The referees judge the quality of the work according to their personal opinions about what may be true or not, not by double-checking the calculations, because that was not possible. How can Mann claim that his greenhouse-believer fellow von Storch has made flawed calculations if he cannot repeat them? He looked at the assumptions and does not accept them, he says. But it may be the results that he does not like.

To be honest, the way v. Storch plays with climate models does not convince me either. The downplaying of the Hockey Stick is nice but the way it was done may simply be bad science due to the fact that the climate models do not reliably model the real climate.
I have seen (and made) so much junk science during my 30 years of work with chemical process analysis by statistical methods that I have become a typical sceptic about almost everything. My processes usually offered a lot of surprises that were never modelled. And they were very simple compared to the World Climate. (Of course I don't believe in the Hockey Stick.)

In fact, global and NH mean temperatures are not very interesting. Nobody lives in the mean temperature; it is the local temperature at a certain time that matters.

Greenhouse religion is a funny thing. There is so much feeling and aggression in it, just as in classical religions.
**************************


7. Consensus Can Be Bad For Climate Science
by Willie Soon and Sallie Baliunas
Scripps Howard News Service, 27-Sept-04

British Prime Minister Tony Blair recently grumbled about the failure of countries to reach agreement on scientific evidence of the human-made portion of global warming, which he views as incontrovertible and disastrous.

A Whitehall spokesperson was quoted: "Until we get consensus on the science we will never get a consensus on the rapid action needed before it's too late."

Proponents of dramatic action on climate are calling for "consensus" among scientists on the issue of global climate change in order to convince policymakers that the United States should immediately and sharply curb carbon dioxide and other greenhouse gas emissions as the promised means to prevent disastrous climate change.

Scientific agreement, though, differs distinctly from consensus wrought by social-political pressures. Efforts to force a consensus are pernicious to science. For the body of evidence and facts on which scientists agree - as currently known - must always be challengeable by new information. That is the basis of the scientific method.

An upcoming journal paper in Environmental Science & Policy sheds some light on the distortion of climate science by "consensus" politics. Daniel Sarewitz of Arizona State University, who was on a panel that authored a 2003 climate report for the National Academies' National Research Council (NRC), provides an inside view of the NRC report's publication process and details what outsiders may get as "consensus." It isn't what most people would expect from a scientific body.

The NRC report introduced the need for continued research: "The consensus view of scientists is that human activities are changing Earth's climate and that this could have very important consequences for human life and natural ecosystems."

That made sense. Most scientists agree that carbon-dioxide concentration has increased by approximately 30% in 200 years because of human activities. And they agree that, all things being equal, the addition of greenhouse gases to the air should cause some warming.

The big questions, which can be answered only by science, are: How much warming and by when? To get accurate answers - pinning down the meaning of "could have very important consequences" - involves tackling uncertainties in climate forecasts.

Thus, the charge to the NRC panelists was also sensible, as it was embodied by the title of the draft report: "Climate Change Feedbacks: Characterizing and Reducing Uncertainties." Climate is a complex and non-linear system, involving "feedbacks" - for example, clouds, ocean currents, plants - that may amplify or dampen initial perturbations into disproportionate outcomes - large or small. Many of those physical processes are highly uncertain, and reducing uncertainty is essential to gaining accurate climate forecasts.
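A standard way to make "amplify or dampen" concrete is the linear feedback relation; this is a textbook illustration in our notation, not the op-ed's. If dT0 is the no-feedback response to a perturbation and f the net feedback factor, then

```latex
\Delta T \;=\; \frac{\Delta T_0}{1 - f}, \qquad
\begin{cases}
0 < f < 1, & \text{feedbacks amplify the initial perturbation,}\\
f < 0, & \text{feedbacks damp it.}
\end{cases}
```

Much of the forecast uncertainty the panel was asked to characterize lives in f: even a modest uncertainty in f, when f is near 1, produces a large uncertainty in the predicted warming.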

But along the way, discussion on research uncertainties was shifted to an insistence that science should look as tidy as possible - a social consensus. As Sarewitz notes, the final NRC report title omitted the highlight on uncertainties and read merely "Understanding Climate Change Feedbacks."

One certainty in science is that all its results are uncertain and achieving better accuracy means reducing uncertainty, but always within limits. Progress comes from tension among hypotheses, equipment and instruments for observing and measuring, experimental results and evidence, with scientists acting as unique mental rebels. Columbia University sociologist Robert K. Merton (1910-2003) noted, "Most institutions demand unqualified faith; but the institution of science makes skepticism a virtue."

The convenience of quiet consensus is in opposition to the structured, skeptical rebelliousness of the scientific method, which strikes at uncertainty by trying to reduce it.

Michael Crichton, author of Jurassic Park and creator-producer of ER, recently remarked, "Consensus is the business of politics. If it's consensus, it isn't science. If it's science, it isn't consensus. Period."

Right now tremendous uncertainty exists in answering important questions about climate. Physical law is not made by social consensus, only by scientific evidence, which comes from acknowledging those things we know as uncertain or even unknown - not accepting as incontrovertible that which is still unproven.
*************************************


8. Britain Plans Contentious Approach To Climate-Change Policies.
PM Tony Blair's speech (Sept 14) on Climate Change

The Prime Minister called climate change the world's greatest environmental challenge in a speech in London.
Full text online <http://www.number-10.gov.uk/output/Page6333.asp>
We copy here certain extracts:

"Let me summarise the evidence:

- The 10 warmest years on record have all been since 1990. Over the last century average global temperatures have risen by 0.6 degrees Celsius: the most drastic temperature rise for over 1,000 years in the northern hemisphere."
...........................

"Prior to the G8 meeting itself we propose first to host an international scientific meeting at the Hadley Centre for Climate Prediction and Research in Exeter in February. More than just another scientific conference, this gathering will address the big questions on which we need to pool the answers available from the science:

- "What level of greenhouse gases in the atmosphere is self-evidently too much?" and
- "What options do we have to avoid such levels?"
……………………

"The situation therefore can be summarised in this way:
1 If what the science tells us about climate change is correct, then unabated it will result in catastrophic consequences for our world.
2 The science, almost certainly, is correct.
3 Recent experience teaches us that it is possible to combine reducing emissions with economic growth.
4 Further investment in science and technology and in the businesses associated with it has the potential to transform the possibilities of such a healthy combination of sustainability and development.
5 To acquire global leadership on this issue, Britain must demonstrate it first at home.
6 The G8 next year, and the EU Presidency provide a great opportunity to push this debate to a new and better level that, after the discord over Kyoto, offers the prospect of agreement and action."
======================

Political Action on Climate Change:
Lord Lawson, Wilfred Beckerman, Ian Byatt, David Henderson, Julian Morris, Alan Peacock, Colin Robinson
Letter, The Times (London), Sept 24, 2004

Sir, Both the Prime Minister and the Leader of the Opposition made major speeches last week on climate change and the policies that are supposedly required to deal with it (reports, September 14 and 15). It appears that, in this area, Tony Blair and Michael Howard are of one mind. They hold the same alarmist view of the world, and call for much the same radical - and costly - programme of action.

Both leaders assert that prospective climate change, arising from human activity, clearly poses a grave and imminent threat to the world. Such statements give too much credence to some current sombre assessments and dark scenarios, and pay no heed to the great uncertainties that still prevail in relation to the causes and consequences of climate change. There are no solid grounds for assuming, as Messrs Blair and Howard do, that global warming demands immediate and far-reaching action.

The actions that they call for chiefly comprise a range of higher targeted subsidies, and of stricter controls and regulations, to limit CO2 emissions. These measures would raise costs for enterprises and households, both directly as consumers and as taxpayers. They would make all of us significantly and increasingly worse off. There are no worthwhile gains to set against these costs. It is absurd to argue, as the Prime Minister did in his speech (and Howard took a similar line), that such policies can "unleash a new and benign commercial force". The new opportunities created for high-cost ventures come as the direct result of suppressing opportunities for their lower-cost rivals: this is already happening in power generation.

It is not only the Prime Minister and Mr Howard who are advancing questionable economic arguments. We consider that the treatment of economic issues by the Intergovernmental Panel on Climate Change is not up to the mark. It is time for finance and economics ministries everywhere, including HM Treasury, to wake up to this situation and take action.

Yours faithfully,
LAWSON of BLABY,
WILFRED BECKERMAN
(Emeritus Fellow, Balliol College, Oxford),
IAN BYATT
(Director-General of Water Services, 1989-2000),
DAVID HENDERSON
(Visiting Professor, Westminster Business School),
JULIAN MORRIS
(Executive Director, International Policy Network),
ALAN PEACOCK
(David Hume Institute, Edinburgh),
COLIN ROBINSON
(Emeritus Professor of Economics, University of Surrey),
c/o Westminster Business School,
35 Marylebone Road, NW1 5LS.
=============

Blair To Seek Consensus On Safe Greenhouse-Gas Levels
News, Nature online, 6 October 2004; doi:10.1038/431619a

London - Tony Blair, the British prime minister, plans to adopt a controversial new approach to international negotiations on climate change, according to UK scientists.

The approach, which his government is expected to announce later this month, would ask world leaders to seek agreement on an acceptable target level for the concentration of greenhouse gases in the atmosphere. But climate-change experts in Britain have expressed concern that such a strategy could dilute existing attempts to cut emissions of greenhouse gases through the implementation of the Kyoto Protocol (see Nature page 613 <http://www.nature.com/uidfinder/10.1038/431613a>).

With Blair hosting a meeting of the Group of Eight industrialized nations (G8) at Gleneagles in Scotland next July, and Britain holding the rotating presidency of the European Union for six months after that, the prime minister wants to provide some global impetus towards action on climate change.

Blair's initiative would get government leaders to work out how they could declare a level at which atmospheric greenhouse-gas concentrations would become "dangerous", say researchers who have discussed the idea with UK government officials. Supporters of the idea believe that discussion of such a long-term limit could help break the deadlock between countries that have ratified the Kyoto Protocol and others, led by the United States, that have rejected it.

UK scientists who have been consulted on the plan welcome Blair's focus on climate change, but warn that it will be hard to reach political or scientific agreement over what constitutes a dangerous level of greenhouse gases. They say that they have expressed their concerns during consultations for a government-run conference on greenhouse-gas stabilization that will take place next February at the Hadley Centre for Climate Prediction and Research in Exeter.

"Any attempt to launch negotiations on a target would be extremely dangerous and misguided," says Michael Grubb, a specialist in climate-change policy at Imperial College London. He says he received a "stony reception" from environment ministry officials when he made this point to them last month at a meeting at the Tyndall Centre for Climate Change Research in Norwich.

Grubb and others say that consideration of such a greenhouse-gas target will lead different countries in divergent directions, with each one looking at local problems, such as the effect of climate change on staple crops or on the frequency of summer heat waves. David Griggs, director of the Hadley Centre - who has also been consulted by the environment ministry - says he fears that nations will find little common ground for identifying such a target.

And Mike Hulme, director of the Tyndall Centre, who is due to meet with ministry officials next week, says that the idea of a global target for greenhouse gases is too distant from people's immediate concerns about the impact of climate change. "It's too remote," he says. "It's not good at changing people's behaviour."

David Warrilow, a ministry official who is involved in the consultation, says that the British government wants other nations to "start to think about progress" towards setting a stabilization level, rather than establishing a firm target. Such debate could feed into future Kyoto Protocol negotiations, he adds.

But some climate-change analysts say the issue will simply divert attention from the pressing need to control emissions of greenhouse gases. "We could invest an enormous amount of time in a fruitless exercise," says Elliot Diringer, director of international strategy at the Pew Center on Global Climate Change in Washington, "rather than focussing on what can be done now."

Diringer agrees that the scope of the Kyoto Protocol should be broadened during its second phase, but he suggests that any new targets should focus on variables over which nations have direct control, such as levels of energy efficiency or the capture and storage of carbon dioxide.
--------------------------

SEPP Comments:

1. The FCCC (Climate Treaty), article 2, says "dangerous to the climate system" -- not to crops, sea level, or butterflies. As I have pointed out in various publications, the historical evidence indicates that the climate system becomes less unstable during warm periods.

2. The radiative forcing from CO2 increases only as the logarithm of its atmospheric concentration (because of near-saturation in the 15-micron absorption bands). This means that a concentration of 4 times pre-industrial is not twice as effective as a doubled value but only 70% greater.

3. By the same token, since greenhouse-gas levels are now about 50% above pre-industrial (in CO2-equivalent concentration), with a human GH contribution of not more than perhaps 0.3 degC (and probably less), going to a doubling (with mostly CO2) will raise temperature by at most a further 0.21 degC.
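A worked version of the arithmetic in point 3, under its stated assumptions (the logarithmic forcing law of point 2, concentration now 1.5 times pre-industrial, at most 0.3 degC of warming attributed to the forcing so far, and response proportional to forcing):

```latex
\Delta F \propto \ln\!\frac{C}{C_0}
\quad\Longrightarrow\quad
\Delta T_{\text{extra}}
  = \Delta T_{\text{so far}}\,\frac{\ln 2 - \ln 1.5}{\ln 1.5}
  = 0.3\,^{\circ}\mathrm{C}\times\frac{0.693 - 0.405}{0.405}
  \approx 0.21\,^{\circ}\mathrm{C}.
```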
=========

We digress: From a von Storch interview in Die Welt

Q: How will the Kyoto Protocol affect the climate?

v. Storch: Practically not at all. The important thing is its psychological effect.

*********************

9. An Admiral At The Helm Of NOAA
Wash Post Oct 5, 2004

Conrad Lautenbacher, the man the U.S. government has put in charge of the weather, doesn't seem worried a bit. He's got the cool nerves of a technocrat, the heart of a sailor. He's the administrator of the National Oceanic and Atmospheric Administration (NOAA), has commanded ships as a vice admiral in the Navy, has a doctorate in applied mathematics from Harvard, and isn't about to be intimidated by a little wind and rain and the occasional growling volcano.

He is an adaptationist.

"In the winter we heat our houses, in the summer we run around in T-shirts. . . . It's why we're the most invasive species on the planet," he says.

Adaptation is the can-do response to the perils of climate change and other global upheavals (disease, invasive species, resource crises, etc.). Lautenbacher assumes we'll find a way to survive and thrive, though it will take hard work, and ingenuity, and perhaps a passel of new satellites and sensors that can help us understand the Earth.

"I want to wire the world. We want to be able to give the world an MRI. We want to take the pulse of the planet," he says.

This is a dramatic shift from the historic strategy regarding natural disasters, which was to pray a lot. Catastrophes were seen as acts of God. Today they are more likely to be seen as cyclical and somewhat predictable events whose effects can be minimized. Death tolls from such things as earthquakes and hurricanes have dropped dramatically in the developed world. Homes are sturdier, flood control is more advanced, and we are less likely to see the kind of disaster that befell South Florida in 1928, when a hurricane drowned more than 2,000 people in the towns along Lake Okeechobee, with some victims frantically climbing trees only to be bitten by the water moccasins that had also taken refuge there.

The highest risk factor in disasters is being poor. The four storms that hit Florida may have destroyed thousands of mobile homes, but they killed fewer than 100 people. Hurricane Jeanne by itself killed about 2,000 people in Haiti, and left an additional 300,000 homeless. The Haitian countryside, denuded of forest, proved no match for the inundating rains, and the residents in their hovels watched helplessly as their surroundings turned into a torrent of water and mud and debris.

FULL ARTICLE at http://www.washingtonpost.com/wp-dyn/articles/A10072-2004Oct5.html

 

 


